Fastly Compute: High-Performance Edge with WebAssembly

Geometric abstraction of a data network in dark blue tones

Fastly Compute (formerly Compute@Edge) is Fastly’s bet on serious edge computing. While Cloudflare Workers runs JavaScript in V8 isolates backed by Workers KV, Fastly chose pure WebAssembly on its own runtime (originally Lucet, now Wasmtime). These are two different philosophies, with implications for performance, portability, and ergonomics. This article compares both honestly and explains when Fastly is the right choice.

Fastly’s Technical Bet

Fastly decided not to build its own JavaScript runtime or lean on Node; it went with WebAssembly as the compilation target:

  • Your code (Rust, Go, AssemblyScript, or JavaScript) compiles to a .wasm binary.
  • Fastly deploys the binary across ~70 global PoPs.
  • Each request invokes an ephemeral instance of the Wasm module.
  • Cold start: ~35 microseconds (μs, not ms), a figure Fastly leans on heavily.

The number is real but deserves context: Workers also cold-starts in under 5 ms, and the difference between ~1 ms and 35 μs is invisible to end users. What it does enable is instantiating thousands of module instances per request without meaningful overhead.
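To make that concrete, a back-of-the-envelope sketch using the quoted figures (the 1 ms number for isolates is an assumption at the low end of the range):

```rust
fn main() {
    // Quoted cold-start figures, in microseconds.
    let wasm_cold_start_us = 35.0; // Fastly Compute
    let isolate_cold_start_us = 1_000.0; // ~1 ms, low end for V8 isolates

    // Instantiating 1,000 fresh instances in the course of one request:
    let n = 1_000.0;
    let wasm_total_ms = wasm_cold_start_us * n / 1_000.0;
    let isolate_total_ms = isolate_cold_start_us * n / 1_000.0;

    // 35 ms of total overhead versus a full second: only the former
    // fits inside a single request's latency budget.
    println!("Wasm: {wasm_total_ms} ms, isolates: {isolate_total_ms} ms");
}
```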

Supported Languages

Officially:

  • Rust: first-class citizen, with the most mature tooling (the fastly crate) and the best docs.
  • JavaScript: JS plus a subset of TypeScript, running on Fastly’s js-compute runtime (SpiderMonkey compiled ahead-of-time to Wasm).
  • Go: supported via TinyGo, with limitations versus standard Go.
  • AssemblyScript: a TypeScript-like language that compiles directly to Wasm.

For new projects, Rust is the recommended choice. For pure-JS teams, support is good but doesn’t match the ergonomics of Workers.

Execution Model

Each request:

  1. Arrives at the nearest PoP.
  2. The Wasm module is instantiated (35 μs cold start).
  3. The handler executes.
  4. The response is returned.
  5. The instance is destroyed (stateless execution by default).

No shared state between requests within the same process — Fastly prioritises determinism and isolation.

Rust Hello World

use fastly::http::{Method, StatusCode};
use fastly::{Error, Request, Response};

// Entry point: Fastly invokes this once per request,
// in a fresh Wasm instance.
#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    match (req.get_method(), req.get_path()) {
        (&Method::GET, "/") => Ok(Response::from_status(StatusCode::OK)
            .with_body("Hi from Fastly Compute")),
        // Anything else gets a 404.
        _ => Ok(Response::from_status(StatusCode::NOT_FOUND)),
    }
}

Deploy:

fastly compute publish

The CLI workflow is polished: local builds, preview deployments, and rollbacks all happen through the fastly CLI.

Fastly Compute vs Cloudflare Workers

Honest comparison:

Aspect         | Fastly Compute         | Cloudflare Workers
---------------|------------------------|-----------------------------------
Runtime        | WebAssembly (Wasmtime) | V8 isolates
Cold start     | ~35 μs                 | ~1-5 ms
Main languages | Rust, JS               | JS, TypeScript
PoPs           | ~70                    | ~300+
KV/store       | KV Store (simple)      | Workers KV (sophisticated), D1, R2
Entry price    | $50/mo                 | $5/mo
DX             | Good (Rust)            | Excellent (JS)
Portability    | Standard Wasm          | Cloudflare-specific APIs

For compute-intensive edge code in Rust or Go, Fastly performs better. For JS/TS apps that lean on the npm ecosystem, Workers is more productive. Cloudflare also has the advantage in PoP count and entry pricing.

Where Fastly Shines

Scenarios where Fastly wins:

  • Portable Rust code. If you already have critical logic in Rust, porting is natural.
  • Compliance and control. Fastly is perceived as more “enterprise-grade”, with mature compliance.
  • Fastly CDN integration. For existing Fastly customers, Compute is the natural evolution.
  • Predictable performance. Sub-ms cold start matters for high-concurrency workloads.
  • Serious custom edge. Complex A/B tests, rewriting, image transforms.

Where Workers Shines

By contrast:

  • JS ecosystem. npm packages work directly (with some adaptations).
  • Durable Objects + KV + D1: Workers offers more built-in primitives.
  • Entry pricing: starting is cheaper.
  • More PoPs: better coverage in some regions.
  • Community: more examples, public tutorials.

Real Cases

Common patterns with Fastly:

  • API gateway: validate auth, transform requests, route by feature flags.
  • Image optimization: resize and optimise images on-the-fly.
  • HTML rewriting: inject banners, personalise content.
  • A/B testing: split traffic with deterministic logic.
  • Geo-based routing: serve different content by country/city.
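The A/B-testing pattern above hinges on deterministic logic: with no state shared between requests, the split must be a pure function of the request. A minimal sketch in plain Rust (std only; the function names are illustrative, and a real deployment should use a hash with a cross-version stability guarantee, which DefaultHasher does not provide):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Deterministically assign a user to a variant bucket. The same
// (user, experiment) pair always yields the same bucket, so no
// state needs to survive between requests.
fn ab_bucket(user_id: &str, experiment: &str, variants: u64) -> u64 {
    let mut hasher = DefaultHasher::new();
    // Mix in the experiment name so different experiments split
    // the same user base independently.
    experiment.hash(&mut hasher);
    user_id.hash(&mut hasher);
    hasher.finish() % variants
}

fn main() {
    let bucket = ab_bucket("user-42", "homepage-redesign", 2);
    // Stable for the same inputs, on any PoP, on any request.
    assert_eq!(bucket, ab_bucket("user-42", "homepage-redesign", 2));
    println!("user-42 sees variant {}", if bucket == 0 { "A" } else { "B" });
}
```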

For pure static content, a normal CDN suffices. Compute is for edge logic.

Realistic Costs

Fastly Compute:

  • Essential plan: $50/mo includes 1M requests.
  • Additional requests: $0.50 / 1M.
  • Compute time: billed separately per millisecond used.

Cloudflare Workers:

  • Free: 100k requests/day.
  • Paid: $5/mo includes 10M requests.
  • Additional: $0.30 / 1M.

For low traffic, Workers is clearly cheaper. For high traffic with complex logic, Fastly can be competitive thanks to the efficiency of its Wasm runtime.
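Plugging the listed prices into a quick model makes the request-cost gap visible (a sketch covering request pricing only; Fastly’s separate per-ms compute charge and any bandwidth costs are ignored, so real bills will differ):

```rust
// Monthly request cost in USD at `millions` of requests, using the
// entry prices listed above. Compute-time charges are not modeled.
fn fastly_request_cost(millions: f64) -> f64 {
    // $50/mo base including 1M requests, then $0.50 per extra million.
    50.0 + (millions - 1.0).max(0.0) * 0.50
}

fn workers_request_cost(millions: f64) -> f64 {
    // $5/mo base including 10M requests, then $0.30 per extra million.
    5.0 + (millions - 10.0).max(0.0) * 0.30
}

fn main() {
    for millions in [1.0, 10.0, 100.0] {
        println!(
            "{millions}M req/mo -> Fastly ${:.2}, Workers ${:.2}",
            fastly_request_cost(millions),
            workers_request_cost(millions)
        );
    }
}
```

On request pricing alone, Workers stays cheaper at every volume; Fastly’s case rests on the compute-time side, where an efficient Wasm workload can consume fewer billed milliseconds.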

Current Limitations

Honestly:

  • Limited package size (~100 MB). Large Wasm-compiled apps may not fit.
  • Max time per request: ~60s.
  • No persistent native WebSockets (Workers Durable Objects wins here).
  • Limited debugging: Wasm stack traces aren’t as clear as JS devtools.
  • Less community content and fewer public examples than Workers.

WebAssembly at Edge: The Trend

Beyond Fastly:

  • Cloudflare has Workers for Platforms with optional Wasm support.
  • Wasmer Edge bets on distributed Wasm at the edge.
  • shuttle.rs: full-stack Rust with edge deploys.
  • Suborbital (since acquired) built edge functions on Wasm.

The “Wasm + edge” pattern is consolidating. Fastly was a pioneer; others follow.

Migrating from Workers

If you consider moving from Workers to Fastly:

  • JavaScript code: moderate rework; the runtime differs (no Node, nothing V8-specific).
  • CF-specific features (Durable Objects, D1, KV): replace with external patterns (Redis, PostgreSQL, S3).
  • Deploy pipelines: wrangler → fastly (conceptually similar).
  • Performance testing: validate whether the 35 μs cold start actually makes a measurable difference for your workload.

For most teams it’s not worth migrating just for the runtime, but Fastly is a good option for new projects where Rust/Wasm is the natural choice.

Conclusion

Fastly Compute is a serious technical bet on WebAssembly at the edge. For compute-intensive workloads, complex Rust logic, or teams already on Fastly CDN, it’s the more performant option. For the median case of “edge JS logic with the npm ecosystem and simplicity”, Cloudflare Workers remains more practical. The choice isn’t ideological; it’s contextual. What both represent is the real convergence toward serverless functions at the edge, increasingly capable of replacing pieces of traditional infrastructure.

Follow us on jacar.es for more on edge computing, WebAssembly, and distributed architectures.
